Detailed Garment Recovery from a Single-View Image

Authors

  • Shan Yang
  • Tanya Ambert
  • Zherong Pan
  • Ke Wang
  • Licheng Yu
  • Tamara L. Berg
  • Ming C. Lin
Abstract

Most recent garment-capturing techniques rely on acquiring multiple views of clothing, which may not always be readily available, especially in the case of pre-existing photographs from the web. As an alternative, we propose a method that computes a 3D model of a human body and its outfit from a single photograph with little human interaction. Our algorithm not only captures the global shape and geometry of the clothing, but also extracts small yet important details of the cloth, such as occluded wrinkles and folds. Unlike previous methods that use full 3D information (i.e., depth, multi-view images, or sampled 3D geometry), our approach achieves detailed garment recovery from a single-view image by using statistical, geometric, and physical priors and a combination of parameter estimation, semantic parsing, shape recovery, and physics-based cloth simulation. We demonstrate the effectiveness of our algorithm by re-purposing the reconstructed garments for virtual try-on and garment-transfer applications and for cloth animation of digital characters.
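
As a rough illustration of how the four stages named in the abstract could fit together, the Python sketch below lays out parameter estimation, semantic parsing, shape recovery, and physics-based cloth simulation as a minimal, hypothetical pipeline. It is not the authors' implementation: every function name, parameter, and return value is a placeholder standing in for the statistical, geometric, and physical priors the paper actually uses.

# Hypothetical skeleton of the single-view garment recovery pipeline
# described in the abstract; all stage functions are illustrative stubs.

def estimate_parameters(image_path):
    # Stage 1: fit body shape/pose and coarse garment sizing to the photo
    # using a statistical prior (placeholder values only).
    return {"body": {"pose": [], "shape": []}, "garment": {"size": "M"}}

def parse_garment_semantics(image_path):
    # Stage 2: label garment regions in the image (e.g. shirt, skirt)
    # so that appropriate garment templates can be selected.
    return ["shirt", "skirt"]

def recover_garment_shapes(regions, garment_params, body):
    # Stage 3: instantiate garment templates around the estimated body
    # using geometric priors; meshes are left as None placeholders.
    return [{"template": r, "size": garment_params["size"], "mesh": None}
            for r in regions]

def simulate_cloth(draft_garments, body):
    # Stage 4: relax the draft garments with physics-based cloth
    # simulation so wrinkles and folds, including occluded ones,
    # emerge from the physical prior rather than from the image alone.
    return [dict(g, detail="simulated wrinkles and folds")
            for g in draft_garments]

def recover_outfit(image_path):
    params = estimate_parameters(image_path)
    regions = parse_garment_semantics(image_path)
    drafts = recover_garment_shapes(regions, params["garment"], params["body"])
    return params["body"], simulate_cloth(drafts, params["body"])

if __name__ == "__main__":
    body, garments = recover_outfit("photo.jpg")
    print(garments)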

Similar Articles

DeepGarment : 3D Garment Shape Estimation from a Single Image

3D garment capture is an important component for various applications such as free-viewpoint video, virtual avatars, online shopping, and virtual cloth fitting. Due to the complexity of the deformations, capturing 3D garment shapes requires controlled and specialized setups. A viable alternative is image-based garment capture. Capturing 3D garment shapes from a single image, however, is a chal...

Garment Modeling from a Single Image

Modeling of realistic garments is essential for online shopping and many other applications, including virtual characters. Most existing methods either require a multi-camera capture setup or a restricted mannequin pose. We address the garment modeling problem from a single input image. We design an all-pose garment outline interpretation and a shading-based detail modeling algorithm...

Health problems among garment factory workers: A narrative literature review

Background: Garment factories in India contribute to economic growth, and the industry is the second-largest sector for employment. Many unskilled laborers from rural locations work in this sector. The common jobs they handle, such as sewing, ironing, packing, and lifting heavy loads, are monotonous, continuous, and prolonged. Working for a long period of time without rest, absence of personal protec...

The Effects of Compression-Garment Pressure on Recovery After Strenuous Exercise.

Compression garments are frequently used to facilitate recovery from strenuous exercise. PURPOSE: To identify the effects of 2 different grades of compression garment on recovery indices after strenuous exercise. METHODS: Forty-five recreationally active participants (n = 26 male and n = 19 female) completed an eccentric-exercise protocol consisting of 100 drop jumps, after which they were ma...

Future Trends in Perception and Manipulation for Unfolding and Folding Garments

This paper presents current approaches for robotic garment folding-oriented 3D deformable object perception and manipulation. A major portion of these approaches is based on 3D perception algorithms that match garments to a model, and are thus model-based. They require a full view of an extended garment in order to then apply a preprogrammed folding sequence. Other approaches are based on 3D m...


Journal:
  • CoRR

Volume: abs/1608.01250  Issue: -

Pages: -

Publication date: 2016